Firing Rate Models as Associative Memory: Excitatory-Inhibitory Balance for Robust Retrieval

Betteti, Simone, Baggio, Giacomo, Bullo, Francesco, Zampieri, Sandro

arXiv.org Artificial Intelligence

Firing rate models are dynamical systems widely used in applied and theoretical neuroscience to describe local cortical dynamics in neuronal populations. By providing a macroscopic perspective of neuronal activity, these models are essential for investigating oscillatory phenomena, chaotic behavior, and associative memory processes. Despite their widespread use, the application of firing rate models to associative memory networks has received limited mathematical exploration, and most existing studies focus on specific models. Conversely, well-established associative memory designs, such as Hopfield networks, lack key biologically relevant features intrinsic to firing rate models, including positivity and interpretable synaptic matrices that reflect excitatory and inhibitory interactions. To address this gap, we propose a general framework that ensures the emergence of re-scaled memory patterns as stable equilibria in the firing rate dynamics. Furthermore, we analyze the conditions under which the memories are locally and globally asymptotically stable, providing insights into constructing biologically plausible and robust systems for associative memory retrieval.
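As a rough illustration of the kind of dynamics the abstract describes, the sketch below simulates a minimal positive firing rate network, tau * dr/dt = -r + phi(W r), in which a saturating nonlinearity keeps rates nonnegative and a stored binary pattern is a stable equilibrium. The weight construction (a Hebbian excitatory outer product balanced by uniform inhibition), the pattern, and all parameters are illustrative assumptions for this demo, not the general framework proposed in the paper.

```python
import numpy as np

# Illustrative sketch only: a positive firing-rate network with one stored
# binary memory as a stable equilibrium. The Hebbian-plus-uniform-inhibition
# weight matrix is an assumption for the demo, not the paper's construction.

def phi(x):
    """Saturating positive activation: rates are confined to [0, 1]."""
    return np.clip(x, 0.0, 1.0)

n = 8
xi = np.array([1, 1, 0, 1, 0, 0, 1, 0], dtype=float)  # stored memory pattern

# Excitatory Hebbian term (outer product) balanced by global inhibition, so
# that phi(W @ xi) == xi, making xi an equilibrium of the dynamics.
W = np.outer(xi, xi) - 0.5 * np.ones((n, n))

# Start from a noisy corruption of the memory (rates stay nonnegative).
rng = np.random.default_rng(0)
r = np.clip(xi + 0.2 * rng.standard_normal(n), 0.0, 1.0)

# Forward-Euler integration of dr/dt = -r + phi(W r).
dt, steps = 0.1, 200
for _ in range(steps):
    r = r + dt * (-r + phi(W @ r))

print(np.round(r, 3))  # the state relaxes back to the stored pattern xi
```

Because the inhibition drives the input to the inactive units below zero while the excitation saturates the active units, the state contracts exponentially onto the memory, mirroring the positivity and excitatory/inhibitory interpretability the abstract emphasizes.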


Input-Driven Dynamics for Robust Memory Retrieval in Hopfield Networks

Betteti, Simone, Baggio, Giacomo, Bullo, Francesco, Zampieri, Sandro

arXiv.org Artificial Intelligence

The Hopfield model provides a mathematically idealized yet insightful framework for understanding the mechanisms of memory storage and retrieval in the human brain. This model has inspired four decades of extensive research on learning and retrieval dynamics, capacity estimates, and sequential transitions among memories. Notably, the role and impact of external inputs have been largely underexplored, from their effects on neural dynamics to how they facilitate effective memory retrieval. To bridge this gap, we propose a novel dynamical system framework in which the external input directly influences the neural synapses and shapes the energy landscape of the Hopfield model. This plasticity-based mechanism provides a clear energetic interpretation of the memory retrieval process and proves effective at correctly classifying highly mixed inputs. Furthermore, we integrate this model within the framework of modern Hopfield architectures, using this connection to elucidate how current and past information are combined during the retrieval process. Finally, we embed both the classic and the new model in an environment disrupted by noise and compare their robustness during memory retrieval.
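For readers unfamiliar with the baseline the abstract builds on, the sketch below implements the classic Hopfield model: Hebbian weights, asynchronous sign updates, and retrieval of a stored pattern from a corrupted input, with the energy never increasing along the way. The network size, number of patterns, and corruption level are arbitrary illustrative choices; the input-driven synaptic mechanism proposed in the paper is not shown here.

```python
import numpy as np

# Illustrative sketch of the classic Hopfield model that the paper takes as
# its starting point; sizes and corruption level are arbitrary demo choices.

rng = np.random.default_rng(1)
n, p = 128, 2
patterns = rng.choice([-1.0, 1.0], size=(p, n))  # stored +/-1 memories

# Hebbian weight matrix with zero self-coupling.
W = (patterns.T @ patterns) / n
np.fill_diagonal(W, 0.0)

def energy(s):
    """Hopfield energy; asynchronous updates never increase it."""
    return -0.5 * s @ W @ s

# Corrupt the first memory by flipping 16 of its 128 units.
s = patterns[0].copy()
s[:16] *= -1.0
e0 = energy(s)

# A few asynchronous update sweeps: each unit aligns with its local field.
for _ in range(5):
    for i in range(n):
        s[i] = 1.0 if W[i] @ s >= 0 else -1.0

print(int((s == patterns[0]).sum()), "of", n, "units match the stored memory")
```

Descending this fixed energy landscape is exactly the retrieval process whose input-dependent reshaping the abstract investigates.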